924 results for model-based testing


Abstract:

Model-based testing (MBT) relies on models of a system under test and/or its environment to derive test cases for the system. This paper discusses the process of MBT and defines a taxonomy that covers the key aspects of MBT approaches. It is intended to help with understanding the characteristics, similarities and differences of those approaches, and with classifying the approach used in a particular MBT tool. To illustrate the taxonomy, a description of how three different examples of MBT tools fit into the taxonomy is provided.
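
To make the core MBT step concrete, here is a minimal, hypothetical sketch (the model, event names and coverage criterion are illustrative assumptions, not taken from the paper): a system under test is modelled as a small finite state machine, and one abstract test case is derived per transition of the model.

```python
# Minimal model-based testing sketch: derive abstract test cases from a
# finite-state model by covering every transition (illustrative example).
from collections import deque

# Model of a simple login component: state -> {event: next_state}
model = {
    "LoggedOut": {"login_ok": "LoggedIn", "login_fail": "LoggedOut"},
    "LoggedIn": {"logout": "LoggedOut", "view_profile": "LoggedIn"},
}

def sequence_ending_with(start, target_edge):
    """Breadth-first search for an event sequence from `start` whose last
    step takes the transition `target_edge = (state, event)`."""
    queue, seen = deque([(start, [])]), {start}
    while queue:
        state, events = queue.popleft()
        for event, nxt in model[state].items():
            if (state, event) == target_edge:
                return events + [event]
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, events + [event]))
    return None

# One abstract test case per transition of the model (transition coverage).
tests = [sequence_ending_with("LoggedOut", (s, e)) for s in model for e in model[s]]
for tc in tests:
    print(" -> ".join(tc))
```

Each generated event sequence would then be concretised into executable inputs and expected outputs for the real system; how the model is built, which coverage criterion is used and how concretisation is done are exactly the dimensions a taxonomy of MBT approaches has to capture.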

Abstract:

The behavior of composed Web services depends on the results of the invoked services; unexpected behavior of one of the invoked services can threaten the correct execution of an entire composition. This paper proposes an event-based approach to black-box testing of Web service compositions based on event sequence graphs, which are extended by facilities to deal not only with service behavior under regular circumstances (i.e., where cooperating services are working as expected) but also with their behavior in undesirable situations (i.e., where cooperating services are not working as expected). Furthermore, the approach can be used independently of artifacts (e.g., Business Process Execution Language) or type of composition (orchestration/choreography). A large case study, based on a commercial Web application, demonstrates the feasibility of the approach and analyzes its characteristics. Test generation and execution are supported by dedicated tools. In particular, the use of an enterprise service bus for test execution is noteworthy and differs from other approaches. The results of the case study suggest that the new approach can detect faults systematically, performing properly even with complex and large compositions. Copyright © 2012 John Wiley & Sons, Ltd.
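
The paper's tooling is not reproduced here, but the basic test-generation idea over an event sequence graph can be sketched as follows (the graph, event names and the "undesirable" event are hypothetical, not the commercial case study): generate one complete event sequence, from entry to exit, for every edge of the graph, so that both regular and faulty service responses are exercised.

```python
# Sketch of edge-coverage test generation from an event sequence graph (ESG).
from collections import deque

# ESG: node -> set of successor nodes; "[" marks entry, "]" marks exit.
# 'payment_fail' models a cooperating service not working as expected.
esg = {
    "[": {"search"},
    "search": {"add_to_cart"},
    "add_to_cart": {"checkout"},
    "checkout": {"payment_ok", "payment_fail"},
    "payment_ok": {"]"},
    "payment_fail": {"]"},
    "]": set(),
}

def shortest_path(src, dst):
    """Shortest node sequence from src to dst (breadth-first search)."""
    queue, seen = deque([[src]]), {src}
    while queue:
        path = queue.popleft()
        if path[-1] == dst:
            return path
        for nxt in esg[path[-1]]:
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None

# One complete event sequence (entry -> exit) per edge of the ESG.
for node, successors in esg.items():
    for nxt in successors:
        sequence = shortest_path("[", node) + shortest_path(nxt, "]")
        print(" -> ".join(e for e in sequence if e not in ("[", "]")))
```

In the approach described above, such sequences would then be mapped onto concrete service invocations and executed, in this case over an enterprise service bus.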

Abstract:

1-D engine simulation models are widely used for the analysis and verification of air-path design concepts and prediction of the resulting engine transient response. The latter often requires closed-loop control over the model to ensure operation within physical limits and tracking of reference signals. For this purpose, a particular implementation of Model Predictive Control (MPC) based on a corresponding Mean Value Engine Model (MVEM) is reported here. The MVEM is linearised on-line at each operating point to allow for the formulation of quadratic programming (QP) problems, which are solved as part of the proposed MPC algorithm. The MPC output is used to control a 1-D engine model. The closed-loop performance of such a system is benchmarked against the solution of a related optimal control problem (OCP). As an example, this study focuses on the transient response of a light-duty car Diesel engine. For the cases examined, the proposed controller implementation gives a more systematic procedure than other ad hoc approaches that require considerable tuning effort. © 2012 IFAC.
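
As a generic illustration of the kind of problem solved at each control step (not the paper's exact formulation; the weights, horizon and constraints below are placeholders), linearising the MVEM at the current operating point yields a quadratic program of roughly the following form:

\[
\begin{aligned}
\min_{u_0,\dots,u_{N-1}} \quad & \sum_{k=0}^{N-1} \Big[ (x_k - x_k^{\mathrm{ref}})^{\top} Q \,(x_k - x_k^{\mathrm{ref}}) + u_k^{\top} R \, u_k \Big] \\
\text{subject to} \quad & x_{k+1} = A\,x_k + B\,u_k, \qquad x_0 = x(t), \\
& u_{\min} \le u_k \le u_{\max}, \qquad x_{\min} \le x_k \le x_{\max},
\end{aligned}
\]

where A and B come from the on-line linearisation, the state and input bounds encode the physical limits mentioned above, and only the first optimised input u_0 is applied to the 1-D engine model before the problem is re-linearised and solved again at the next step.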

Abstract:

Formal specifications can precisely and unambiguously define the required behavior of a software system or component. However, formal specifications are complex artifacts that need to be verified to ensure that they are consistent, complete, and validated against the requirements. Specification testing or animation tools exist to assist with this by allowing the specifier to interpret or execute the specification. However, currently little is known about how to do this effectively. This article presents a framework and tool support for the systematic testing of formal, model-based specifications. Several important generic properties that should be satisfied by model-based specifications are first identified. Following the idea of mutation analysis, we then use variants or mutants of the specification to check that these properties are satisfied. The framework also allows the specifier to test application-specific properties. All properties are tested for a range of states that are defined by the tester in the form of a testgraph, which is a directed graph that partially models the states and transitions of the specification being tested. Tool support is provided for the generation of the mutants, for automatically traversing the testgraph and executing the test cases, and for reporting any errors. The framework is demonstrated on a small specification and its application to three larger specifications is discussed. Experience indicates that the framework can be used effectively to test small to medium-sized specifications and that it can reveal a significant number of problems in these specifications.
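
The framework and its tool support are not reproduced here, but the testgraph idea can be illustrated with a deliberately tiny, hypothetical example: the nodes of a directed graph stand for specification states, traversing the graph drives execution of the specification, and the same traversal applied to a mutant should reveal a difference.

```python
# Toy sketch of testgraph-driven specification testing with a mutant.
# The "specification" is a trivial counter model; all names are illustrative.

def spec_increment(state):      # operation as defined by the specification
    return state + 1

def mutant_increment(state):    # a mutant: deliberately seeded variant
    return state + 2

# Testgraph: node -> list of (operation name, expected successor node).
testgraph = {0: [("inc", 1)], 1: [("inc", 2)], 2: []}

def traverse(operation):
    """Walk the testgraph, applying `operation` along each edge and
    recording every reached state that differs from the expected node."""
    failures, stack = [], [(0, 0)]          # (node, concrete state)
    while stack:
        node, state = stack.pop()
        for name, expected in testgraph[node]:
            reached = operation(state)
            if reached != expected:
                failures.append((node, name, expected, reached))
            stack.append((expected, reached))
    return failures

print("specification:", traverse(spec_increment))   # no failures expected
print("mutant:       ", traverse(mutant_increment)) # failures reveal the mutant
```

In the framework described above, the checks executed along the traversal are the generic and application-specific properties, and a mutant that no traversal distinguishes from the original would suggest a gap in the testgraph or in the properties being tested.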

Abstract:

Design verification in the digital domain, using model-based principles, is a key research objective to address the industrial requirement for reduced physical testing and prototyping. For complex assemblies, the verification of design and the associated production methods is currently fragmented, prolonged and sub-optimal, as it uses digital and physical verification stages that are deployed in a sequential manner using multiple systems. This paper describes a novel, hybrid design verification methodology that integrates model-based variability analysis with measurement data of assemblies, in order to reduce simulation uncertainty and allow early design verification from the perspective of satisfying key assembly criteria.
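
As a generic illustration only (a one-dimensional stack-up with made-up dimensions, not the paper's hybrid methodology), the benefit of feeding measurement data into a variability analysis can be seen by re-running a simple Monte Carlo assembly simulation once some components have been measured:

```python
# Generic 1-D assembly gap stack-up: predicted variation at the design stage
# versus after substituting measured values for two components (illustrative).
import random

N = 10_000

def gap(a, b, c):
    return 10.0 - (a + b + c)    # nominal housing length minus the part stack

def mean(xs):
    return sum(xs) / len(xs)

def std(xs):
    m = mean(xs)
    return (sum((x - m) ** 2 for x in xs) / len(xs)) ** 0.5

# Design-stage prediction: every part drawn from its tolerance distribution.
design = [gap(random.gauss(3.0, 0.05), random.gauss(3.0, 0.05),
              random.gauss(3.0, 0.05)) for _ in range(N)]

# Verification-stage prediction: parts a and b have been measured.
a_meas, b_meas = 3.04, 2.97
verified = [gap(a_meas, b_meas, random.gauss(3.0, 0.05)) for _ in range(N)]

print(f"design stage:      gap = {mean(design):.3f} +/- {std(design):.3f}")
print(f"with measurements: gap = {mean(verified):.3f} +/- {std(verified):.3f}")
```

The narrower predicted spread in the second run mirrors the stated aim of reducing simulation uncertainty by integrating measurement data of assemblies into the model-based analysis.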

Abstract:

This paper provides a first look at the acceptance of Accountable-eHealth (AeH) systems, a new genre of eHealth systems designed to manage information privacy concerns that hinder the proliferation of eHealth. The underlying concept of AeH systems is appropriate use of information through after-the-fact accountability for intentional misuse of information by healthcare professionals. An online questionnaire survey was utilised for data collection from three educational institutions in Queensland, Australia. A total of 23 hypotheses relating to 9 constructs were tested using a structural equation modelling technique. A total of 334 valid responses were received. The cohort consisted of medical, nursing and other health related students studying at various levels in both undergraduate and postgraduate courses. Hypothesis testing disproved 7 of the hypotheses. The empirical research model developed was capable of predicting 47.3% of healthcare professionals’ perceived intention to use AeH systems. A validation of the model with a wider survey cohort would be useful to confirm the current findings.

Abstract:

Length scale-down (LS) model tests have been traditionally employed for laboratory studies on aeolian vibration of transmission line conductors. The span adopted is normally 30 m and is recommended by the relevant Indian, as well as other, standards. The traditionally adopted length of the LS model is reexamined herein to establish the rationale behind the choice. Based on the theoretical studies discussed, certain guidelines for the choice of model span of conductor are emphasized. In addition, the adequacy of the LS span as a tool for predicting the performance of the full span is reestablished.

Abstract:

This paper provides a first look at the acceptance of Accountable-eHealth (AeH) systems, a new genre of eHealth systems designed to manage information privacy concerns that hinder the proliferation of eHealth. The underlying concept of AeH systems is appropriate use of information through after-the-fact accountability for intentional misuse of information by healthcare professionals. An online questionnaire survey was utilised for data collection from three educational institutions in Queensland, Australia. A total of 23 hypotheses relating to 9 constructs were tested using a structural equation modelling technique. The moderation effects on the hypotheses were also tested for six moderation factors to understand their role in the designed research model. A total of 334 valid responses were received. The cohort consisted of medical, nursing and other health related students studying at various levels in both undergraduate and postgraduate courses. Hypothesis testing provided sufficient evidence to accept 7 hypotheses. The empirical research model developed was capable of predicting 47.3% of healthcare professionals’ perceived intention to use AeH systems. All six moderation factors showed a significant influence on the research model. A validation of this model with a wider survey cohort is recommended as a future study.

Abstract:

Concentrating solar power is an important way of providing renewable energy. Model simulation approaches play a fundamental role in the development of this technology and, for this, accurate validation of the models is crucial. This work presents the validation of the heat loss model of the absorber tube of a parabolic trough plant by comparing the model heat loss estimates with real measurements in a specialized testing laboratory. The study focuses on implementing in the model a physically meaningful and widely valid formulation of the absorber total emissivity as a function of the surface temperature. For this purpose, the spectral emissivity of several absorber samples is measured and, from these data, the absorber total emissivity curve is obtained according to the Planck function. This physically meaningful formulation is used as an input parameter in the heat loss model and a successful validation of the model is performed. Since measuring the spectral emissivity of the absorber surface may be complex and is sample-destructive, a new methodology for characterizing the absorber's emissivity is proposed. This methodology provides an estimate of the absorber total emissivity, retaining its physically meaningful and widely valid formulation according to the Planck function, with no need for direct spectral measurements. This proposed method is also successfully validated and the results are shown in the present paper.
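
The formulation referred to above is the standard Planck-weighted average of the spectral emissivity (written here in generic form; the paper's measured emissivity data are not reproduced):

\[
\varepsilon(T) = \frac{\displaystyle\int_0^{\infty} \varepsilon_\lambda(\lambda)\, B_\lambda(\lambda,T)\, d\lambda}{\displaystyle\int_0^{\infty} B_\lambda(\lambda,T)\, d\lambda},
\qquad
B_\lambda(\lambda,T) = \frac{2 h c^{2}}{\lambda^{5}} \, \frac{1}{e^{hc/(\lambda k_B T)} - 1},
\]

where \varepsilon_\lambda is the measured spectral emissivity of the absorber coating and B_\lambda is Planck's spectral radiance, so the total emissivity entering the heat loss model becomes a function of the absorber surface temperature T.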

Abstract:

Face recognition with unknown, partial distortion and occlusion is a practical problem, and has a wide range of applications, including security and multimedia information retrieval. The authors present a new approach to face recognition subject to unknown, partial distortion and occlusion. The new approach is based on a probabilistic decision-based neural network, enhanced by a statistical method called the posterior union model (PUM). PUM is an approach for ignoring severely mismatched local features and focusing the recognition mainly on the reliable local features. It thereby improves the robustness while assuming no prior information about the corruption. We call the new approach the posterior union decision-based neural network (PUDBNN). The new PUDBNN model has been evaluated on three face image databases (XM2VTS, AT&T and AR) using testing images subjected to various types of simulated and realistic partial distortion and occlusion. The new system has been compared to other approaches and has demonstrated improved performance.
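
The exact posterior union model is not reproduced here; the toy sketch below only illustrates the idea described above, namely basing the decision on the most reliable local features so that occluded or distorted regions are ignored (the scores, identities and subset size M are fabricated):

```python
# Toy illustration: score each identity by its best M local-feature
# log-likelihoods, so severely mismatched regions do not dominate the decision.
M = 3   # number of local features assumed reliable

# Fabricated local log-likelihood scores for 5 facial regions per identity.
scores = {
    "identity_A": [-1.0, -0.8, -9.5, -1.2, -0.9],   # one region badly occluded
    "identity_B": [-2.1, -2.4, -2.0, -2.3, -2.2],
}

def union_score(local_scores, m=M):
    """Sum of the m best local scores; the rest are treated as unreliable."""
    return sum(sorted(local_scores, reverse=True)[:m])

best = max(scores, key=lambda name: union_score(scores[name]))
print({name: round(union_score(s), 2) for name, s in scores.items()})
print("recognised as:", best)
```

Despite the heavily corrupted region, identity_A still wins because only its most reliable features contribute, which is the kind of robustness the PUM-enhanced network is reported to provide without prior knowledge of the corruption.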

Abstract:

1. Quantitative reconstruction of past vegetation distribution and abundance from sedimentary pollen records provides an important baseline for understanding long term ecosystem dynamics and for the calibration of earth system process models such as regional-scale climate models, widely used to predict future environmental change. Most current approaches assume that the amount of pollen produced by each vegetation type, usually expressed as a relative pollen productivity term, is constant in space and time.
2. Estimates of relative pollen productivity can be extracted from extended R-value analysis (Parsons and Prentice, 1981) using comparisons between pollen assemblages deposited into sedimentary contexts, such as moss polsters, and measurements of the present day vegetation cover around the sampled location (a simplified form of this relationship is sketched after this list). Vegetation survey method has been shown to have a profound effect on estimates of model parameters (Bunting and Hjelle, 2010); a standard method is therefore an essential prerequisite for testing some of the key assumptions of pollen-based reconstruction of past vegetation, such as the assumption that relative pollen productivity is effectively constant in space and time within a region or biome.
3. This paper systematically reviews the assumptions and methodology underlying current models of pollen dispersal and deposition, and thereby identifies the key characteristics of an effective vegetation survey method for estimating relative pollen productivity in a range of landscape contexts.
4. It then presents the methodology used in a current research project, developed during a practitioner workshop. The method selected is pragmatic, designed to be replicable by different research groups, usable in a wide range of habitats, and requiring minimum effort to collect adequate data for model calibration rather than representing some ideal or required approach. Using this common methodology will allow project members to collect multiple measurements of relative pollen productivity for major plant taxa from several northern European locations in order to test the assumption of uniformity of these values within the climatic range of the main taxa recorded in pollen records from the region.
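
For orientation only, and in a deliberately simplified form rather than the full Parsons and Prentice treatment, the linear relationship underlying the extended R-value analysis mentioned in point 2 can be written as

\[
p_{ik} \approx \alpha_i\, v_{ik} + \omega_i,
\]

where p_{ik} is the pollen proportion of taxon i in sample k, v_{ik} is the distance-weighted abundance of taxon i in the surveyed vegetation around site k, \alpha_i is the relative pollen productivity of taxon i and \omega_i is a background pollen term; the vegetation survey enters directly through v_{ik}, which is why the survey method has such a strong effect on the estimated \alpha_i.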

Abstract:

This paper presents a novel approach to WLAN propagation models for use in indoor localization. The major goal of this work is to eliminate the need for in situ data collection to generate the fingerprinting map; instead, the map is generated using analytical propagation models such as COST Multi-Wall, COST 231 average wall and Motley-Keenan. The location estimation algorithms kNN (K-Nearest Neighbour) and WkNN (Weighted K-Nearest Neighbour) were used to determine the accuracy of the proposed technique. This work is based on analytical and measurement tools to determine which path loss propagation models are better suited to location estimation applications based on the Received Signal Strength Indicator (RSSI). The study presents different proposals for choosing the most appropriate values for the model parameters, such as obstacle attenuations and coefficients. Some adjustments to these models, particularly to Motley-Keenan, considering the thickness of walls, are proposed. The best solution found is based on the adjusted Motley-Keenan and COST models, which allows the propagation loss to be estimated for several environments. Results obtained from two testing scenarios showed the reliability of the adjustments, providing smaller errors between the measured and predicted values.
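
As a rough illustration of the overall approach (generic multi-wall parameters, a fabricated floor plan and access point layout, not the paper's calibrated values), a COST-style multi-wall model can be used to predict the RSSI fingerprint map against which kNN matching is then performed:

```python
# Sketch: build a model-predicted RSSI fingerprint map with a generic
# multi-wall path-loss model, then locate a device with kNN (illustrative).
import math

def multi_wall_rssi(d, walls, tx_power=-20.0, n=2.2, wall_loss=3.5):
    """Predicted RSSI (dBm) at distance d metres through `walls` walls."""
    return tx_power - 10 * n * math.log10(max(d, 0.1)) - walls * wall_loss

aps = {"AP1": (0.0, 0.0), "AP2": (10.0, 0.0), "AP3": (0.0, 10.0)}

def walls_between(p, ap):        # stand-in for a real floor-plan query
    return int(abs(p[0] - ap[0]) // 5 + abs(p[1] - ap[1]) // 5)

def fingerprint(p):
    return {name: multi_wall_rssi(math.dist(p, pos), walls_between(p, pos))
            for name, pos in aps.items()}

# Model-generated radio map on a 1 m grid (no in situ survey needed).
radio_map = {(x, y): fingerprint((x, y)) for x in range(11) for y in range(11)}

def knn_locate(measured, k=3):
    """Average the k grid points whose predicted fingerprints best match."""
    def mismatch(point):
        fp = radio_map[point]
        return sum((fp[a] - measured[a]) ** 2 for a in aps) ** 0.5
    best = sorted(radio_map, key=mismatch)[:k]
    return (sum(p[0] for p in best) / k, sum(p[1] for p in best) / k)

print(knn_locate(fingerprint((4.2, 6.7))))   # estimate should be near (4.2, 6.7)
```

A weighted variant (WkNN) would weight each of the k neighbours by the inverse of its fingerprint mismatch instead of averaging them equally, and the wall-count and attenuation terms are where adjustments such as the proposed wall-thickness refinement of Motley-Keenan would enter.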